Storage capacity and learning capability of quantum neural networks

Authors

Abstract

We study the storage capacity of quantum neural networks (QNNs), described by completely positive trace preserving (CPTP) maps acting on an N-dimensional Hilbert space. We demonstrate that attractor QNNs can store in a non-trivial manner up to N linearly independent pure states. For n qubits, QNNs can reach an exponential storage capacity, O(2^n), clearly outperforming standard classical neural networks, whose storage capacity scales linearly with the number of neurons. We estimate, employing the Gardner program, the relative volume of CPTP maps with M ⩽ N stationary states and show that this volume decreases exponentially with M and shrinks to zero for M ⩾ N + 1. We generalize our results to storing mixed states as well as input–output relations for feed-forward QNNs. Our approach opens the path to relate storage properties of QNNs to quantum features of the maps. This paper is dedicated to the memory of Peter Wittek.
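The abstract's central comparison is a scaling argument: n qubits give a Hilbert space of dimension N = 2^n, so an attractor QNN can store up to 2^n linearly independent pure states, while a classical attractor network stores a number of patterns linear in its neuron count. A minimal illustrative sketch of this arithmetic (not code from the paper; the ~0.138n figure is the standard Hopfield-model capacity estimate, used here only to make the linear scaling concrete):

```python
def qnn_capacity(n_qubits: int) -> int:
    """Maximum stored pure states per the abstract: the Hilbert-space
    dimension N = 2**n for n qubits."""
    return 2 ** n_qubits

def hopfield_capacity(n_neurons: int) -> int:
    """Classical Hopfield estimate (~0.138 * n retrievable patterns);
    any linear-in-n estimate would show the same qualitative gap."""
    return int(0.138 * n_neurons)

# Compare the two scalings for a few system sizes.
for n in (4, 8, 16, 32):
    print(f"n={n:2d}  QNN: {qnn_capacity(n):>10d}  Hopfield: {hopfield_capacity(n):>2d}")
```

Even at n = 32 the classical estimate stays in single digits while the QNN bound exceeds four billion, which is the exponential-versus-linear gap the abstract highlights.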


Similar articles

Learning capability and storage capacity of two-hidden-layer feedforward networks

The problem of the necessary complexity of neural networks is of interest in applications. In this paper, learning capability and storage capacity of feedforward neural networks are considered. We markedly improve the recent results by introducing neural-network modularity logically. This paper rigorously proves in a constructive method that two-hidden-layer feedforward networks (TLFNs) with 2/...

Full text

Storage capacity of two-dimensional neural networks.

We investigate the maximum number of embedded patterns in the two-dimensional Hopfield model. The ground-state energies of two specific network states, namely, the energies of the pure-ferromagnetic state and the state with one specific stored pattern, are calculated exactly in terms of the correlation function of the ferromagnetic Ising model. We also investigate the energy landscape around them a...

Full text

Real-time learning capability of neural networks

In some practical applications of neural networks, fast response to external events within an extremely short time is highly demanded and expected. However, the extensively used gradient-descent-based learning algorithms obviously cannot satisfy the real-time learning needs in many applications, especially for large-scale applications and/or when higher generalization performance is required. B...

Full text

Phase Diagram and Storage Capacity of Sequence Processing Neural Networks

We solve the dynamics of Hopfield-type neural networks which store sequences of patterns, close to saturation. The asymmetry of the interaction matrix in such models leads to violation of detailed balance, ruling out an equilibrium statistical mechanical analysis. Using generating functional methods we derive exact closed equations for dynamical order parameters, viz. the sequence ove...

Full text

Phase Diagram and Storage Capacity of Sequence-Storing Neural Networks

We solve the dynamics of Hopfield–type neural networks which store sequences of patterns, close to saturation. The asymmetry of the interaction matrix in such models leads to violation of detailed balance, ruling out an equilibrium statistical mechanical analysis. Using generating functional methods we derive exact closed equations for dynamical order parameters, viz. the sequence overlap and c...

Full text


Journal

Journal title: Quantum science and technology

Year: 2021

ISSN: 2364-9054, 2364-9062

DOI: https://doi.org/10.1088/2058-9565/ac070f